Block-bootstrapping for noisy data.
Authors
Abstract
BACKGROUND: Statistical inference of signals is key to understanding fundamental processes in the neurosciences. It is essential to distinguish true effects from random ones. To this end, the statistical concepts of confidence intervals, significance levels and hypothesis tests are employed. Bootstrap-based approaches complement analytical approaches, replacing the latter whenever they are not feasible.
NEW METHOD: The block-bootstrap was introduced as an adaptation of the ordinary bootstrap for serially correlated data. For the block-bootstrap, the signals are cut into independent blocks, yielding independent samples. The key parameter for block-bootstrapping is the block length. In the presence of noise, naïve approaches to block-bootstrapping fail. Here, we present a block-bootstrapping approach that copes even with high noise levels. This method naturally leads to a block-bootstrapping algorithm that is immediately applicable to observed signals.
RESULTS: While naïve block-bootstrapping easily misestimates the block length, and thereby over-estimates the confidence bounds by 50%, our new approach determines the confidence bounds optimally while keeping the coverage correct.
COMPARISON WITH EXISTING METHODS: In several applications bootstrapping replaces analytical statistics. Block-bootstrapping is applied to serially correlated signals. Noise, ubiquitous in the neurosciences, is typically neglected. Our new approach not only explicitly includes the presence of (observational) noise in the statistics, but also outperforms conventional methods and reduces the number of false-positive conclusions.
CONCLUSIONS: The presence of noise affects statistical inference. Our ready-to-apply method enables a rigorous statistical assessment based on block-bootstrapping for noisy, serially correlated data.
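To make the general procedure concrete, the following is a minimal sketch of a moving-block bootstrap confidence interval for the mean of a noisy, serially correlated signal. It is a generic illustration, not the algorithm proposed in the paper; the block length `block_len`, the AR(1) test signal and the observational noise level are assumptions chosen only for demonstration.

```python
import numpy as np

def moving_block_bootstrap(x, block_len, n_boot=1000, rng=None):
    """Moving-block bootstrap replicates of the mean of a 1-D signal.

    x         : serially correlated (possibly noisy) signal, shape (n,)
    block_len : block length; must be long enough to capture the serial correlation
    n_boot    : number of bootstrap replicates
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = x.size
    n_blocks = int(np.ceil(n / block_len))
    starts = np.arange(n - block_len + 1)          # all overlapping block start indices
    means = np.empty(n_boot)
    for b in range(n_boot):
        picks = rng.choice(starts, size=n_blocks, replace=True)
        sample = np.concatenate([x[s:s + block_len] for s in picks])[:n]
        means[b] = sample.mean()
    return means

# Example: AR(1) signal plus heavy observational noise (illustrative values).
rng = np.random.default_rng(0)
n = 2000
ar = np.empty(n)
ar[0] = 0.0
for t in range(1, n):
    ar[t] = 0.8 * ar[t - 1] + rng.normal()
signal = ar + rng.normal(scale=2.0, size=n)

boot_means = moving_block_bootstrap(signal, block_len=50, n_boot=2000, rng=1)
ci = np.percentile(boot_means, [2.5, 97.5])        # 95% percentile interval
print("95% CI for the mean:", ci)
```

Choosing `block_len` is exactly the step that naïve approaches get wrong when observational noise is present, which is the problem the abstract's method addresses; the fixed value used above is purely illustrative.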
Similar resources
Output-only Modal Analysis of a Beam Via Frequency Domain Decomposition Method Using Noisy Data
The output data from a structure is the building block for output-only modal analysis. The structure response in the output data, however, is usually contaminated with noise. Naturally, the success of output-only methods in determining the modal parameters of a structure depends on noise level. In this paper, the possibility and accuracy of identifying the modal parameters of a simply supported...
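As a rough sketch of the frequency domain decomposition idea on noisy, multi-channel output-only data (cross-spectral density matrix, singular value decomposition per frequency line, peaks of the first singular value as candidate natural frequencies), the example below is a generic illustration, not the referenced paper's implementation; the synthetic two-channel signal and all parameter values are assumptions.

```python
import numpy as np
from scipy.signal import csd, find_peaks

def fdd_first_singular_values(Y, fs, nperseg=1024):
    """Frequency Domain Decomposition on multi-channel output data.

    Y  : array of shape (n_channels, n_samples), measured responses only
    fs : sampling frequency in Hz
    Returns the frequency axis and the first singular value of the
    cross-spectral density matrix at each frequency line.
    """
    n_ch = Y.shape[0]
    f, _ = csd(Y[0], Y[0], fs=fs, nperseg=nperseg)
    G = np.empty((f.size, n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(Y[i], Y[j], fs=fs, nperseg=nperseg)
    # SVD at every frequency line; peaks of the first singular value
    # indicate candidate natural frequencies.
    s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(f.size)])
    return f, s1

# Example: two noisy channels sharing a 5 Hz mode (illustrative only).
fs = 200.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
mode = np.sin(2 * np.pi * 5.0 * t)
Y = np.vstack([mode + 0.5 * rng.normal(size=t.size),
               0.7 * mode + 0.5 * rng.normal(size=t.size)])

f, s1 = fdd_first_singular_values(Y, fs)
peaks, _ = find_peaks(s1, height=0.1 * s1.max())
print("candidate natural frequencies (Hz):", f[peaks])
```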
Bootstrapping with Noise: an Effective Regularization Technique
Bootstrap samples with noise are shown to be an effective smoothness and capacity control technique for training feed-forward networks and for other statistical methods such as generalized additive models. It is shown that the noisy bootstrap performs best in conjunction with weight decay regularization and ensemble averaging. The two-spiral problem, a highly non-linear, noise-free dataset, is used to d...
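A minimal sketch of this noisy-bootstrap idea, assuming scikit-learn's MLPRegressor as the feed-forward network; the noise scale, weight-decay strength (`alpha`) and ensemble size are illustrative choices, not values from the referenced paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def noisy_bootstrap_ensemble(X, y, n_models=10, noise_scale=0.1, rng=None):
    """Train an ensemble on bootstrap resamples with added input noise.

    Each member sees a resampled training set whose inputs are jittered
    with Gaussian noise, which acts as a smoothness/capacity control.
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    models = []
    for m in range(n_models):
        idx = rng.integers(0, n, size=n)                        # ordinary bootstrap resample
        X_b = X[idx] + rng.normal(scale=noise_scale, size=X[idx].shape)
        y_b = y[idx]
        net = MLPRegressor(hidden_layer_sizes=(20,), alpha=1e-3,  # alpha = weight decay
                           max_iter=2000, random_state=m)
        models.append(net.fit(X_b, y_b))
    return models

def ensemble_predict(models, X):
    """Average the member predictions (ensemble averaging)."""
    return np.mean([m.predict(X) for m in models], axis=0)

# Example: noisy 1-D regression problem (illustrative only).
rng = np.random.default_rng(0)
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X).ravel() + 0.2 * rng.normal(size=200)

models = noisy_bootstrap_ensemble(X, y, n_models=10, noise_scale=0.05, rng=1)
print(ensemble_predict(models, X[:5]))
```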
Bootstrapping I(1) data
A functional law is given for an I(1) sample data version of the continuous-path block bootstrap of Paparoditis and Politis (2001a). The results provide an alternative demonstration that continuous-path block bootstrap unit root tests are consistent under the null.
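A generic sketch of block-bootstrapping an I(1) series by resampling blocks of its increments and re-integrating them; this is not the exact continuous-path construction of Paparoditis and Politis, and the block length and random-walk example are assumptions made for illustration.

```python
import numpy as np

def block_bootstrap_i1(x, block_len, rng=None):
    """Generate one bootstrap replicate of an I(1) series.

    The series is differenced, blocks of increments are resampled with
    replacement, and the resampled increments are re-integrated so the
    bootstrap path starts at x[0] and remains a unit-root process.
    """
    rng = np.random.default_rng(rng)
    dx = np.diff(np.asarray(x, dtype=float))
    n = dx.size
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    inc = np.concatenate([dx[s:s + block_len] for s in starts])[:n]
    return x[0] + np.concatenate(([0.0], np.cumsum(inc)))

# Example: random-walk data and one bootstrap path (illustrative only).
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=500))       # I(1): pure random walk
x_star = block_bootstrap_i1(x, block_len=25, rng=1)
print(x_star[:5])
```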
Bootstrapping and Evaluating Named Entity Recognition in the Biomedical Domain
We demonstrate that bootstrapping a gene name recognizer for FlyBase curation from automatically annotated noisy text is more effective than fully supervised training of the recognizer on more general manually annotated biomedical text. We present a new test set for this task based on an annotation scheme which distinguishes gene names from gene mentions, enabling a more consistent annotation. ...
Improving the performance of MFCC for Persian robust speech recognition
Mel-frequency cepstral coefficients (MFCCs) are the most widely used features in speech recognition, but they are very sensitive to noise. In this paper, to achieve satisfactory performance in Automatic Speech Recognition (ASR) applications, we introduce a new noise-robust set of MFCC vectors estimated through the following steps. First, spectral mean normalization is a pre-processing step which applies to t...
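A minimal sketch of mean normalization of the log-mel spectrum before the cepstral transform, assuming librosa for the mel spectrogram; it illustrates the general pre-processing idea only and is not the referenced paper's proposed feature set, and the mel/cepstral dimensions and the synthetic noisy tone are assumptions.

```python
import numpy as np
import librosa
from scipy.fft import dct

def mfcc_with_spectral_mean_norm(y, sr, n_mels=26, n_mfcc=13):
    """MFCC-style features with mean normalization of the log-mel spectrum.

    Subtracting the per-band mean of the log spectrum over time removes
    stationary (channel/noise) components before the cepstral transform.
    """
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    log_mel = np.log(mel + 1e-10)                     # shape (n_mels, n_frames)
    log_mel -= log_mel.mean(axis=1, keepdims=True)    # spectral mean normalization
    cepstra = dct(log_mel, type=2, axis=0, norm='ortho')[:n_mfcc]
    return cepstra                                    # shape (n_mfcc, n_frames)

# Example on a synthetic noisy tone (illustrative only).
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
y = np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.default_rng(0).normal(size=sr)
feats = mfcc_with_spectral_mean_norm(y, sr)
print(feats.shape)
```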
Journal: Journal of Neuroscience Methods
Volume: 219, Issue: 2
Pages: -
Publication date: 2013